
    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be judged in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios that can both compensate for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
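    The dependence on distributional assumptions can be made concrete with a small Monte Carlo sketch. The sub-factor decomposition (a 10-fold interspecies and a 10-fold intraspecies factor, each split into toxicokinetic and toxicodynamic parts) follows the conventional scheme described above; the geometric means and spreads below are assumptions for illustration, not values from the paper.

    ```python
    # Compare the product of four sub-factors under two assumed distributions.
    # Both are calibrated so the deterministic default of 100 sits at the
    # centre, yet their upper percentiles differ markedly.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    gm, gsd = 10 ** 0.5, 2.0  # assumed geometric mean/SD per sub-factor

    lognormal = np.exp(rng.normal(np.log(gm), np.log(gsd), (4, n))).prod(axis=0)
    uniform = rng.uniform(1.0, 2 * gm - 1.0, (4, n)).prod(axis=0)

    for name, product in [("lognormal", lognormal), ("uniform", uniform)]:
        p50, p95 = np.percentile(product, [50, 95])
        print(f"{name:9s}  median={p50:7.1f}  95th percentile={p95:7.1f}")
    ```

    Under these assumptions, both choices keep the median near 100, but the lognormal product yields a far larger 95th percentile than the uniform one, which is the sensitivity to distributional choice the abstract points to.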

    A mathematical model for breath gas analysis of volatile organic compounds with special emphasis on acetone

    Recommended standardized procedures for determining exhaled lower respiratory nitric oxide and nasal nitric oxide have been developed by task forces of the European Respiratory Society and the American Thoracic Society. These recommendations have paved the way for the measurement of nitric oxide to become a diagnostic tool for specific clinical applications. It would be desirable to develop similar guidelines for the sampling of other trace gases in exhaled breath, especially volatile organic compounds (VOCs), which reflect ongoing metabolism. The concentrations of water-soluble, blood-borne substances in exhaled breath are influenced by: (i) breathing patterns affecting gas exchange in the conducting airways; (ii) the concentrations in the tracheo-bronchial lining fluid; and (iii) the alveolar and systemic concentrations of the compound. The classical Farhi equation takes only the alveolar concentrations into account. Real-time measurements of acetone in end-tidal breath under an ergometer challenge show characteristics which cannot be explained within the Farhi setting. Here we develop a compartment model that reliably captures these profiles and is capable of relating breath concentrations to the systemic concentrations of acetone. By comparison with experimental data it is inferred that the major part of the variability in breath acetone concentrations (e.g., in response to moderate exercise or altered breathing patterns) can be attributed to airway gas exchange, with minimal changes in the underlying blood and tissue concentrations. Moreover, it is deduced that end-tidal breath concentrations of acetone measured during resting conditions and free breathing are rather poor indicators of endogenous levels. In particular, the current formulation includes the classical Farhi and Scheid series inhomogeneity models as special limiting cases.
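    For reference, the classical Farhi relation mentioned in the abstract expresses, in the usual notation, the alveolar concentration in terms of the mixed venous concentration, the blood:air partition coefficient, and the ventilation-perfusion ratio:

    ```latex
    % Classical Farhi relation: alveolar concentration C_A as a function of
    % the mixed venous concentration C_{\bar v}, the blood:air partition
    % coefficient, and the ventilation-perfusion ratio.
    \[
      C_A = \frac{C_{\bar{v}}}{\lambda_{b:\mathrm{air}} + \dot{V}_A / \dot{Q}_c}
    \]
    ```

    Because this relation depends only on alveolar exchange, it cannot reproduce the exercise-induced acetone profiles described above, which motivates the airway-compartment extension.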

    Evolutionary, ecological and biotechnological perspectives on plasmids resident in the human gut mobile metagenome

    Numerous mobile genetic elements (MGE) are associated with the human gut microbiota and are collectively referred to as the gut mobile metagenome. The role of this flexible gene pool in the development and functioning of the gut microbial community remains largely unexplored, yet recent evidence suggests that at least some MGE comprising this fraction of the gut microbiome reflect the co-evolution of host and microbe in the gastro-intestinal tract. In addition, the high level of novel gene content typical of MGE, coupled with their predicted high diversity, suggests that the mobile metagenome constitutes an immense and largely unexplored gene-space likely to encode many novel activities with potential biotechnological or pharmaceutical value, as well as being important to the development and functioning of the gut microbiota. Of the various types of MGE that comprise the gut mobile metagenome, plasmids are of particular importance, since these elements are often capable of autonomous transfer between disparate bacterial species and are known to encode accessory functions that increase bacterial fitness in a given environment, facilitating bacterial adaptation. In this article, current knowledge regarding plasmids resident in the human gut mobile metagenome is reviewed, and available strategies to access and characterize this portion of the gut microbiome are described. The relative merits of these methods and their present as well as prospective impact on our understanding of the human gut microbiota are discussed.

    Simulating Microdosimetry in a Virtual Hepatic Lobule

    The liver plays a key role in removing harmful chemicals from the body and is therefore often the first tissue to suffer potentially adverse consequences. To protect public health it is necessary to quantitatively estimate the risk of long-term low-dose exposure to environmental pollutants. Animal testing is the primary tool for extrapolating human risk, but it is fraught with uncertainty, necessitating novel alternative approaches. Our goal is to integrate in vitro liver experiments with agent-based cellular models to simulate a spatially extended hepatic lobule. Here we describe a graphical model of the sinusoidal network that efficiently simulates portal-to-centrilobular mass transfer in the hepatic lobule. We analyzed the effects of vascular topology and metabolism on the cell-level distribution following oral exposure to chemicals. The spatial distribution of metabolically inactive chemicals was similar across different vascular networks and a baseline well-mixed compartment. When chemicals were rapidly metabolized, concentration heterogeneity of the parent compound increased across the vascular network. As a result, our spatially extended lobule generated greater variability in dose-dependent cellular responses, in this case apoptosis, than was observed in the classical well-mixed liver or in a parallel-tubes model. The mass-balanced graphical approach to modeling the hepatic lobule is computationally efficient for simulating long-term exposure, modular for incorporating complex cellular interactions, and flexible for dealing with evolving tissues.
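    The portal-to-centrilobular gradient described above can be illustrated with a minimal steady-state sketch (not the authors' model; all parameter values are assumed). Treating one sinusoid as a chain of well-mixed compartments with flow Q and first-order metabolic clearance k*V, each compartment attenuates the inflowing concentration by Q / (Q + k*V):

    ```python
    # Steady-state parent-compound concentration along one sinusoid,
    # from the portal triad (zone 1) to the central vein (last zone).
    def zonal_concentrations(c_in, n_zones=8, flow=1.0, k_met=0.0, vol=0.1):
        """Return the outflow concentration of each zone along the chain."""
        attenuation = flow / (flow + k_met * vol)
        return [c_in * attenuation ** (i + 1) for i in range(n_zones)]

    inert = zonal_concentrations(1.0, k_met=0.0)   # no metabolism: uniform
    rapid = zonal_concentrations(1.0, k_met=50.0)  # rapid metabolism: gradient
    print("inert:", [f"{c:.2f}" for c in inert])
    print("rapid:", [f"{c:.2f}" for c in rapid])
    ```

    Consistent with the abstract, the inert chemical exposes all zones equally, while the rapidly metabolized one concentrates parent-compound exposure in the periportal zones, producing the cell-level heterogeneity a well-mixed compartment cannot capture.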

    Determination of no-observed effect level (NOEL)-biomarker equivalents to interpret biomonitoring data for organophosphorus pesticides in children

    Background: Environmental exposure to organophosphorus pesticides has been characterized in various populations, but interpretation of these data from a health risk perspective remains an issue. The current paper proposes biological reference values to help interpret biomonitoring data related to an exposure to organophosphorus pesticides in children for which measurements of alkylphosphate metabolites are available.

    Methods: Published models describing the kinetics of malathion and chlorpyrifos in humans were used to determine no-observed effect level (NOEL) biomarker equivalents for methylphosphates and ethylphosphates, respectively. These were expressed in the form of cumulative urinary amounts of alkylphosphates over specified time periods corresponding to an absorbed NOEL dose (derived from a published human exposure dose) and assuming various plausible exposure scenarios. Cumulative amounts of methylphosphate and ethylphosphate metabolites measured in the urine of a group of Quebec children were then compared to the proposed biological reference values.

    Results: From a published NOEL dose for malathion and chlorpyrifos, the model predicts corresponding oral biological reference values for methylphosphate and ethylphosphate derivatives of 106 and 52 nmol/kg of body weight, respectively, in 12-h nighttime urine collections, and dermal biological reference values of 40 and 32 nmol/kg of body weight. Out of the 442 available urine samples, only one showed a methylphosphate excretion exceeding the biological reference value established on the basis of a dermal exposure scenario, and none of the methylphosphate or ethylphosphate excretion values were above the obtained oral biological reference values, which reflect the main exposure route in children.

    Conclusion: This study is a first step towards the development of biological guidelines for organophosphorus pesticides using a toxicokinetic modeling approach, which can be used to provide a health-based interpretation of biomonitoring data in the general population.
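    The comparison step in the Results can be sketched as a simple screen using the reference values reported above; the sample value below is hypothetical.

    ```python
    # Biological reference values from the abstract, in nmol/kg body weight
    # per 12-h nighttime urine collection: (oral BRV, dermal BRV).
    REFERENCE_VALUES = {
        "methylphosphates": (106, 40),
        "ethylphosphates": (52, 32),
    }

    def screen_sample(metabolite, excretion_nmol_per_kg):
        """Flag cumulative urinary excretion exceeding either reference value."""
        oral, dermal = REFERENCE_VALUES[metabolite]
        return {
            "exceeds_oral": excretion_nmol_per_kg > oral,
            "exceeds_dermal": excretion_nmol_per_kg > dermal,
        }

    print(screen_sample("methylphosphates", 45.0))  # hypothetical child sample
    ```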

    Toxicity Testing in the 21st Century: Defining New Risk Assessment Approaches Based on Perturbation of Intracellular Toxicity Pathways

    The approaches to quantitatively assessing the health risks of chemical exposure have not changed appreciably in the past 50 to 80 years, the focus remaining on high-dose studies that measure adverse outcomes in homogeneous animal populations. This expensive, low-throughput approach relies on conservative extrapolations to relate animal studies to much lower-dose human exposures and is of questionable relevance to predicting risks to humans at their typical low exposures. It makes little use of a mechanistic understanding of the mode of action by which chemicals perturb biological processes in human cells and tissues. An alternative vision, proposed by the U.S. National Research Council (NRC) report Toxicity Testing in the 21st Century: A Vision and a Strategy, called for moving away from traditional high-dose animal studies to an approach based on perturbation of cellular responses using well-designed in vitro assays. Central to this vision are (a) “toxicity pathways” (the innate cellular pathways that may be perturbed by chemicals) and (b) the determination of chemical concentration ranges where those perturbations are likely to be excessive, thereby leading to adverse health effects if present for a prolonged duration in an intact organism. In this paper we briefly review the original NRC report and responses to it over the past 3 years, and discuss how the change in testing might be achieved in the U.S. and in the European Union (EU). EU initiatives in developing alternatives to animal testing of cosmetic ingredients have run very much in parallel with the NRC report. Moving from current practice to the NRC vision would require using prototype toxicity pathways to develop case studies showing the new vision in action. In this vein, we also discuss how the proposed strategy for toxicity testing might be applied to the toxicity pathways associated with DNA damage and repair.
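    The notion of a concentration range where pathway perturbation becomes excessive can be illustrated with a hedged sketch: given an assumed Hill-shaped in vitro concentration-response for a toxicity-pathway assay, one can solve for the concentration producing a chosen benchmark response. All parameter values here are invented for the example.

    ```python
    # Invert an assumed Hill model to obtain a benchmark concentration (BMC).
    def hill_response(conc, top=1.0, ec50=10.0, n=2.0):
        """Fractional pathway activation at a given concentration (uM)."""
        return top * conc ** n / (ec50 ** n + conc ** n)

    def benchmark_concentration(bmr, top=1.0, ec50=10.0, n=2.0):
        """Concentration at which the Hill response equals bmr."""
        return ec50 * (bmr / (top - bmr)) ** (1.0 / n)

    bmc10 = benchmark_concentration(0.10)  # 10% activation as point of departure
    print(f"BMC10 = {bmc10:.2f} uM")       # ~3.33 uM under these assumptions
    ```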

    Read-across and new approach methodologies applied in a 10-step framework for cosmetics safety assessment – A case study with parabens

    Parabens are esters of para-hydroxybenzoic acid that have been used as preservatives for decades in many types of products, including agrochemicals, pharmaceuticals, food and cosmetics. This illustrative case study with propylparaben (PP) demonstrates a 10-step read-across (RAX) framework in practice. It aims to establish a proof-of-concept for the value added by new approach methodologies (NAMs) in RAX for use in a next-generation risk assessment (NGRA) in order to assess consumer safety after exposure to PP-containing cosmetics. In addition to structural and physico-chemical properties, in silico information, toxicogenomics, in vitro toxicodynamic data, toxicokinetic data from PBK models, and bioactivity data are used to provide evidence of the chemical and biological similarity of PP and its analogues and to establish potency trends for the effects observed in vitro. The chemical category under consideration is the short (C1–C4) linear-chain n-alkyl parabens: methylparaben, ethylparaben, propylparaben and butylparaben. The goal of this case study is to illustrate how a practical framework for RAX can be used to fill a hypothetical data gap for reproductive toxicity of the target chemical PP.
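    The potency-trend step of such a read-across can be sketched as a simple interpolation across the category: fit a trend over the source analogues and read the target off the fitted line. The potency numbers below are placeholders, not data from the case study.

    ```python
    # Read-across by interpolating a potency trend over the n-alkyl paraben
    # category (C1-C4); the target propylparaben (C3) is the data gap.
    import numpy as np

    chain_length = np.array([1, 2, 4])          # methyl-, ethyl-, butylparaben
    log_potency = np.array([-1.0, -0.5, 0.6])   # hypothetical log10 potencies

    slope, intercept = np.polyfit(chain_length, log_potency, 1)
    predicted = slope * 3 + intercept           # read across to C3
    print(f"predicted log10 potency for propylparaben: {predicted:.2f}")
    ```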

    Computational Aspects in Uncertainty Analyses of Physiologically-Based Pharmacokinetic Models


    Challenges in the application of quantitative approaches in risk assessment: a case study with di-(2-ethylhexyl)phthalate
